Orthogonal Connectivity Factorization: Interpretable Decomposition of Variability in Correlation Matrices

Authors

  • Aapo Hyvärinen
  • Junichiro Hirayama
  • Vesa Kiviniemi
  • Motoaki Kawanabe
Abstract

In many multivariate time series, the correlation structure is nonstationary, that is, it changes over time. The correlation structure may also change as a function of other cofactors, for example, the identity of the subject in biomedical data. A fundamental approach for the analysis of such data is to estimate the correlation structure (connectivities) separately in short time windows or for different subjects and use existing machine learning methods, such as principal component analysis (PCA), to summarize or visualize the changes in connectivity. However, the visualization of such a straightforward PCA is problematic because the ensuing connectivity patterns are much more complex objects than, say, spatial patterns. Here, we develop a new framework for analyzing variability in connectivities using the PCA approach as the starting point. First, we show how to analyze and visualize the principal components of connectivity matrices by a tailor-made rank-two matrix approximation in which we use the outer product of two orthogonal vectors. This leads to a new kind of transformation of eigenvectors that is particularly suited for this purpose and often enables interpretation of the principal component as connectivity between two groups of variables. Second, we show how to incorporate the orthogonality and the rank-two constraint in the estimation of PCA itself to improve the results. We further provide an interpretation of these methods in terms of estimation of a probabilistic generative model related to blind separation of dependent sources. Experiments on brain imaging data give very promising results.
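The pipeline sketched in the abstract — windowed correlation matrices, PCA over their vectorized forms, then a rank-two approximation of a principal component by the symmetrized outer product of two orthogonal vectors — can be illustrated numerically. The sketch below is not the authors' code: the toy two-group data, the Frobenius-norm fit, and all variable names are illustrative assumptions. It uses the standard fact that for a symmetric matrix M, the best approximation of the form s(uvᵀ + vuᵀ) with u ⊥ v is obtained from the eigenvectors of M with the largest and smallest eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical setup): 10 variables whose correlation structure
# drifts across 40 time windows via a shared latent signal.
n_vars, n_windows, win_len = 10, 40, 200
corrs = []
for w in range(n_windows):
    s = rng.standard_normal((win_len, 1))          # latent signal
    noise = rng.standard_normal((win_len, n_vars))
    weights = np.zeros(n_vars)
    weights[:5] = 0.3 + 0.7 * w / n_windows        # group A couples positively
    weights[5:] = -0.3 - 0.7 * w / n_windows       # group B couples negatively
    x = noise + s * weights
    corrs.append(np.corrcoef(x, rowvar=False))

# PCA over the vectorized (upper-triangular) connectivity matrices.
iu = np.triu_indices(n_vars, k=1)
C = np.array([c[iu] for c in corrs])
C -= C.mean(axis=0)
_, _, Vt = np.linalg.svd(C, full_matrices=False)

# Reshape the first principal component back into a symmetric
# "eigenconnectivity" matrix M.
M = np.zeros((n_vars, n_vars))
M[iu] = Vt[0]
M = M + M.T

# Rank-two approximation M ≈ s (u v^T + v u^T) with u ⊥ v:
# take the eigenvectors of M with the most positive and most negative
# eigenvalues; their sum/difference give the two orthogonal vectors.
lam, E = np.linalg.eigh(M)                 # eigenvalues in ascending order
e_neg, e_pos = E[:, 0], E[:, -1]
u = (e_pos + e_neg) / np.sqrt(2)
v = (e_pos - e_neg) / np.sqrt(2)
s = (lam[-1] - lam[0]) / 2                 # common magnitude fitted to both
M2 = s * (np.outer(u, v) + np.outer(v, u))

print("u . v =", float(u @ v))             # orthogonality of the two vectors
err = np.linalg.norm(M - M2) / np.linalg.norm(M)
print("relative approximation error:", float(err))
```

When the principal component really reflects connectivity between two groups of variables, the signs of u and v tend to separate those groups, which is what makes the rank-two form easier to visualize than the raw eigenconnectivity matrix.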


Similar Articles

Characterizing Variability of Modular Brain Connectivity with Constrained Principal Component Analysis

Characterizing the variability of resting-state functional brain connectivity across subjects and/or over time has recently attracted much attention. Principal component analysis (PCA) serves as a fundamental statistical technique for such analyses. However, performing PCA on high-dimensional connectivity matrices yields complicated "eigenconnectivity" patterns, for which systematic interpretat...


NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET Specifying Gaussian Markov Random Fields with Incomplete Orthogonal Factorization using Givens Rotations

In this paper an approach for finding a sparse incomplete Cholesky factor through an incomplete orthogonal factorization with Givens rotations is discussed and applied to Gaussian Markov random fields (GMRFs). The incomplete Cholesky factor obtained from the incomplete orthogonal factorization is usually sparser than the commonly used Cholesky factor obtained through the standard Cholesky facto...


GIFT: Guided and Interpretable Factorization for Tensors - An Application to Large-Scale Multi-platform Cancer Analysis

Motivation: Given multi-platform genome data with prior knowledge of functional gene sets, how can we extract interpretable latent relationships between patients and genes? More specifically, how can we devise a tensor factorization method which produces an interpretable gene factor matrix based on gene set information while maintaining the decomposition quality and speed? Method: We propose GIFT,...


New Bases for Polynomial-Based Spaces

Since it is well known that the Vandermonde matrix is ill-conditioned, while the interpolation itself is not unstable in function space, this paper surveys the choices of other new bases. These bases are data-dependent and are categorized into discretely l2-orthonormal and continuously L2-orthonormal bases. The first one constructs a unitary Gramian matrix in the space l2(X), while the late...


An SVD-Like Matrix Decomposition and Its Applications

A matrix S ∈ C^(2m×2m) is symplectic if SJS* = J, where J = [[0, −I_m], [I_m, 0]]. Symplectic matrices play an important role in the analysis and numerical solution of matrix problems involving the indefinite inner product x*(iJ)y. In this paper we provide several matrix factorizations related to symplectic matrices. We introduce a singular value-like decomposition B = QDS^(−1) for any real matrix B ∈ R^(n×...



Journal:
  • Neural Computation

Volume 28, Issue 3

Pages: –

Publication date: 2016